A New Database Generation Method Combining Maximin Method and Kriging Prediction
Abstract
The numerical simulation of eddy-current non-destructive testing (ECT) methods involves high complexity and an expensive computational load. However, one needs to reach reliable solutions for these problems in order to be able, in particular, to solve the related inverse problem. To overcome this difficulty, a new approach has recently appeared. The main idea is to build a problem-specific database (for a given ECT application) containing the calculated probe responses ("data") for well-chosen defect configurations (at certain values of the "defect parameters"); see, for instance, [1, 2]. Once this database is built, the end-user of the ECT technique only has to query it: a computationally cheap method retrieves the sought defect parameters from the database, given the measured data.

However, the construction of such databases is not a simple task. The database should be "optimal" in some sense. Optimality is a rather general property; in our context, it means that the database achieves a prescribed precision when a given interpolation method (based on the stored data points) is applied, while the size of the database remains as small as possible. The generation of optimal databases is challenging and is an intensively studied topic in non-destructive evaluation. To the best of our knowledge, the preferred approach to database generation in this field so far has been the application of mesh-generation methods (mostly originating from finite-element meshing techniques); see, for instance, [1, 2].

The present contribution provides a new methodology for optimal database generation. Let us introduce the following notation: the n-th entry of the database consists of a D-dimensional parameter vector pn (the defect is characterized by D parameters) and an M-dimensional data vector zn (zn can be complex, e.g., the impedance variation of a probe coil). The parameter vectors, which relate to all possible defects that one expects to find in the given experiment, span the parameter space P. The data vectors, which are related to the parameter vectors in the parameter space, span the data space Z. The basic idea of our method is to construct a database whose data points zn (n = 1, 2, ..., N) are spread as widely as possible over Z; in other words, the data space Z is uniformly sampled by the database. The proposed methodology is a combination of a maximin method (a well-known tool in decision theory, game theory and statistics) and kriging prediction (a stochastic method for interpolating a function from observed function values, using a Gaussian process to model the function). A closely related idea appears (in a quite different context) in [3]. However, the methodology presented in [3] does not work in the case of high-dimensional data spaces, as in our ECT application (note that M might be several hundred). Our algorithm is an incremental procedure: the database is built by inserting new entries (a parameter vector and the corresponding data vector) one by one. The maximin criterion is used to determine the next parameter vector to be inserted so that the resulting data vector is as far as possible from its nearest neighbor in the data space. Since the distance between the data vector to be inserted and those already stored is not known a priori, we use kriging to predict the distance relations in the data space. To illustrate the method, numerical simulations will be discussed.
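As a rough illustration of this incremental maximin/kriging loop, the sketch below builds a small database for a toy forward model. It is only a hypothetical reconstruction of the idea described above: the toy simulator `simulate`, the random candidate set, the choice of kriging the distances to the stored data vectors, and the use of scikit-learn's GaussianProcessRegressor are assumptions of this illustration, not the implementation of the referenced work.

```python
# Hypothetical sketch of the incremental maximin/kriging sampling loop
# (assumed names and toy forward model; not the authors' implementation).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def simulate(p):
    """Stand-in for the expensive ECT forward solver: p (D,) -> z (M,)."""
    # Toy nonlinear map; in the real application M may be several hundred.
    return np.concatenate([np.sin(3.0 * p), np.cos(2.0 * p) ** 2])

D, N_INIT, N_TOTAL, N_CAND = 2, 5, 30, 500

# Initial database: a few parameter vectors sampled over P = [0, 1]^D.
P_db = rng.uniform(0.0, 1.0, size=(N_INIT, D))
Z_db = np.array([simulate(p) for p in P_db])

while len(P_db) < N_TOTAL:
    # Kriging surrogate of p -> (distance from z(p) to each stored data vector),
    # one simple way to model the "distance relations" in the data space Z.
    dists = np.linalg.norm(Z_db[:, None, :] - Z_db[None, :, :], axis=-1)
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
    gp.fit(P_db, dists)  # multi-output regression: one output per stored entry

    # Maximin step: among random candidate parameters, keep the one whose
    # predicted nearest-neighbour distance in the data space is the largest.
    cand = rng.uniform(0.0, 1.0, size=(N_CAND, D))
    pred = gp.predict(cand)                     # shape (N_CAND, len(P_db))
    best = cand[np.argmax(pred.min(axis=1))]

    # Run the expensive simulation only for the selected point and insert it.
    P_db = np.vstack([P_db, best])
    Z_db = np.vstack([Z_db, simulate(best)])
```

In this simplified variant, the distance to each stored entry is kriged jointly over the parameter space; other formulations of the predicted distance relations would fit the same incremental loop.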
A comparison of the performances of the databases generated by this method and by the "classical" methods will be carried out. Preliminary results show that the proposed algorithm spreads the data vectors over the data space better than some conventional sampling methods. Another aim of these simulations is to explore and point out possible pitfalls and limitations of this database generation method.

Acknowledgments
This research is partially supported by the French ANR through the "INDIAC" project and by the DIGITEO cluster's project No. 2008-15D. Laboratoire des Signaux et Systèmes and SUPELEC are founding members of the DIGITEO cluster.

References
1. Sz. Gyimóthy, I. Kiss, J. Pávó, "Adaptive sampling technique based on moving meshes for building data-equidistant inversion databases for NDT," OIPE 2008, Sept. 14-17, 2008, Ilmenau (Germany), pp. 139-140.
2. J. Pávó, Sz. Gyimóthy, "Adaptive inversion database for electromagnetic nondestructive evaluation," NDT&E International, vol. 40, no. 3, 2007, pp. 192-202.
3. R. Bettinger, P. Duchêne, L. Pronzato, E. Thierry, "Design of experiments for response diversity," J. Phys.: Conf. Ser., vol. 135, 012017, 2008.